# 12-language support
## Granite 4.0 Tiny Base Preview
ibm-granite · Apache-2.0 · Large Language Model · Transformers
Granite-4.0-Tiny-Base-Preview is a 7-billion-parameter Mixture of Experts (MoE) language model developed by IBM, featuring a 128k-token context window and expressiveness enhanced by its hybrid Mamba-2 architecture.
156 · 12
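
Since the card lists Transformers support, here is a minimal sketch of loading the model for plain text completion; the repository id `ibm-granite/granite-4.0-tiny-base-preview` is an assumption inferred from the card name, not confirmed by this page.

```python
# A minimal sketch, not an official example: load the base model for plain
# text completion. The repository id is inferred from the card name.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "ibm-granite/granite-4.0-tiny-base-preview"  # assumed repo id

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # keep the MoE weights in half precision
    device_map="auto",           # spread layers across available GPUs/CPU
)

# Base (non-instruct) model, so prompt it as a plain completion task.
prompt = "Mixture of Experts models route each token to"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=50)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```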
## Llama 4 Scout 17B 16E Unsloth Dynamic Bnb 4bit
unsloth · Other license · Multimodal Fusion · Transformers · Supports Multiple Languages
Llama 4 Scout is Meta's Mixture of Experts multimodal model with 17 billion active parameters and 16 experts, supporting 12 languages and image understanding; this build is Unsloth's dynamic BitsAndBytes (bnb) 4-bit quantization.
1,935 · 2
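
As a rough sketch of using the pre-quantized checkpoint through Transformers: the repository id `unsloth/Llama-4-Scout-17B-16E-unsloth-dynamic-bnb-4bit` is an assumption based on the card name, and `bitsandbytes` must be installed because the weights ship in 4-bit.

```python
# A minimal sketch, assuming the repository id below (inferred from the card
# name) and an environment with accelerate + bitsandbytes for 4-bit weights.
from transformers import AutoModelForImageTextToText, AutoProcessor

model_id = "unsloth/Llama-4-Scout-17B-16E-unsloth-dynamic-bnb-4bit"  # assumed repo id

processor = AutoProcessor.from_pretrained(model_id)
model = AutoModelForImageTextToText.from_pretrained(model_id, device_map="auto")

# Text-only completion shown here; the same processor also accepts interleaved
# image/text inputs for the model's image-understanding side.
prompt = "Machine translation between twelve languages requires"
inputs = processor(text=prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=40)
print(processor.batch_decode(outputs, skip_special_tokens=True)[0])
```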
## Upos Multi Fast
flair · Sequence Labeling · Supports Multiple Languages
A fast multilingual Universal POS (UPOS) tagging model provided by Flair, supporting part-of-speech tagging in 12 languages.
226 · 5
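
Flair exposes taggers like this through its SequenceTagger API; a minimal sketch, assuming the model is published under the id `flair/upos-multi-fast` (inferred from the card name):

```python
# A minimal sketch using the Flair sequence-tagging API; the model id is an
# assumption based on the card name.
from flair.data import Sentence
from flair.models import SequenceTagger

# Download and load the fast multilingual universal POS tagger.
tagger = SequenceTagger.load("flair/upos-multi-fast")

# The same model handles sentences in any of its supported languages.
sentence = Sentence("Ich liebe Berlin und New York.")
tagger.predict(sentence)

# Show each token together with its predicted UPOS tag.
print(sentence.to_tagged_string())
```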